Ph.D. Research Proposal: Iterative Processes for Solving Nonlinear Operator Equations
Author
Abstract
An important topic in nonlinear analysis and convex optimization concerns solving inclusions of the form 0 ∈ A(x), where A is a maximal monotone operator on a Hilbert space H. Its importance in convex optimization is evidenced by the fact that many problems involving convexity can be formulated as finding zeros of maximal monotone operators; convex minimization and convex-concave minimax problems, to mention but a few, can be cast in this form. In particular, the subdifferential ∂f of a proper, convex, lower semi-continuous (lsc) function f is a maximal monotone operator, and a point p ∈ H minimizes f if and only if 0 ∈ ∂f(p). One of the most powerful and versatile techniques for solving variational inequalities, convex minimization problems, and convex-concave minimax (saddle-point) problems is the proximal point algorithm (PPA).
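As a concrete illustration (not part of the proposal itself), the PPA iterate x_{k+1} = (I + λA)^{-1}(x_k) can be sketched for A = ∂f with f(x) = ‖x‖₁, whose resolvent is the well-known soft-thresholding map. The function names and parameter values below are illustrative:

```python
import numpy as np

def prox_l1(v, lam):
    # Resolvent (I + lam*∂f)^{-1} for f(x) = ||x||_1: soft-thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def proximal_point(prox, x0, lam=0.5, iters=50):
    # PPA iteration: x_{k+1} = (I + lam*A)^{-1}(x_k) = prox_{lam f}(x_k).
    x = x0
    for _ in range(iters):
        x = prox(x, lam)
    return x

# 0 ∈ ∂f(p) holds exactly at p = 0, the minimizer of ||x||_1.
x_star = proximal_point(prox_l1, np.array([3.0, -2.0]))
```

Here each step shrinks every coordinate toward zero by λ, so the iterates reach the zero of ∂f in finitely many steps; for a general maximal monotone A only weak convergence of the PPA is guaranteed.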
Similar references
A new iteration method for solving a class of Hammerstein type integral equations system
In this work, a new iterative method is proposed for obtaining the approximate solution of a class of Hammerstein-type integral equation systems. The method is built on the Richardson iteration for solving an algebraic linear system of equations. Conditions for the existence and uniqueness of a solution for this type of equation are imposed. Convergence analysis and error bou...
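The Richardson iteration underlying the method above can be sketched as follows; the matrix, right-hand side, and step size are illustrative assumptions, not data from the paper:

```python
import numpy as np

def richardson(A, b, alpha, iters=200):
    # Richardson iteration: x_{k+1} = x_k + alpha*(b - A x_k).
    # Converges when the spectral radius of (I - alpha*A) is below 1.
    x = np.zeros_like(b)
    for _ in range(iters):
        x = x + alpha * (b - A @ x)
    return x

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])   # symmetric positive definite, eigenvalues ~2.38, ~4.62
b = np.array([1.0, 2.0])
x = richardson(A, b, alpha=0.2)  # alpha < 2/lambda_max guarantees convergence
```

For this matrix the contraction factor is max|1 − 0.2λᵢ| ≈ 0.52, so 200 iterations drive the residual far below machine-visible tolerance.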
Solving systems of nonlinear equations using decomposition technique
A systematic way is presented for the construction of a multi-step iterative method with frozen Jacobian. The inclusion of an auxiliary function is discussed. The presented analysis shows how to incorporate the auxiliary function so that the order of convergence and the computational cost of the Newton multi-step method are preserved. The auxiliary function provides a way to overcome the singul...
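A minimal sketch of a frozen-Jacobian multi-step Newton iteration (without the paper's auxiliary function, which is not specified in this excerpt; the test system is an illustrative assumption):

```python
import numpy as np

def frozen_newton(F, J, x0, substeps=3, iters=20, tol=1e-12):
    # Evaluate the Jacobian once per outer iteration and reuse it for
    # several Newton substeps. In practice an LU factorization of Jx
    # would be computed once and reused instead of repeated solves.
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        Jx = J(x)  # Jacobian frozen for this outer iteration
        for _ in range(substeps):
            x = x - np.linalg.solve(Jx, F(x))
        if np.linalg.norm(F(x)) < tol:
            break
    return x

# Illustrative system: x^2 + y^2 = 1, x = y  (root: x = y = 1/sqrt(2)).
F = lambda v: np.array([v[0]**2 + v[1]**2 - 1.0, v[0] - v[1]])
J = lambda v: np.array([[2.0 * v[0], 2.0 * v[1]], [1.0, -1.0]])
root = frozen_newton(F, J, np.array([1.0, 0.5]))
```

Freezing the Jacobian trades some per-substep convergence order for a much lower cost per substep, which is the trade-off the abstract's analysis quantifies.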
AN ITERATIVE METHOD WITH SIX-ORDER CONVERGENCE FOR SOLVING NONLINEAR EQUATIONS
A modification of Newton's method with higher-order convergence is presented, based on Frontini's third-order method. The new method requires two steps per iteration. Convergence analysis demonstrates that the order of convergence is 6. Numerical examples illustrate that the algorithm is more efficient and performs better than classical Newton's method and ...
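The excerpt does not spell out the correction steps, so as an illustration of the general two-step template only, here is Traub's classical two-step Newton scheme (third order, one derivative evaluation reused across both steps), plainly not the sixth-order method of the abstract:

```python
def two_step_newton(f, df, x0, iters=10, tol=1e-14):
    # Traub's two-step scheme:
    #   y = x - f(x)/f'(x)        (Newton predictor)
    #   x = y - f(y)/f'(x)        (corrector reusing the same derivative)
    x = x0
    for _ in range(iters):
        d = df(x)
        y = x - f(x) / d
        x = y - f(y) / d
        if abs(f(x)) < tol:
            break
    return x

# Illustrative equation: x^3 - 2 = 0, root 2^(1/3).
root = two_step_newton(lambda x: x**3 - 2.0, lambda x: 3.0 * x**2, 1.0)
```

Higher-order two-step methods such as the one in the abstract replace the simple corrector with a more elaborate update while keeping the same two-evaluations-of-f structure.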
Seventh-order iterative algorithm free from second derivative for solving algebraic nonlinear equations
A matrix LSQR algorithm for solving constrained linear operator equations
In this work, an iterative method based on a matrix form of the LSQR algorithm is constructed for solving the linear operator equation $\mathcal{A}(X)=B$ and the minimum Frobenius norm residual problem $\|\mathcal{A}(X)-B\|_F$, where $X\in \mathcal{S}:=\{X\in \mathbb{R}^{n\times n} \mid X=\mathcal{G}(X)\}$ and $\mathcal{A}$ is the linear operator from $\mathbb{R}^{n\times n}$ onto $\mathbb{R}^{r\times s}$, ...
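As a hedged illustration of the least-squares formulation only: for the simple left-multiplication operator $\mathcal{A}(X)=MX$ (an assumption, with no constraint set $\mathcal{S}$), vectorization gives vec(MX) = (I ⊗ M) vec(X), and a dense least-squares solver can stand in for the iterative matrix LSQR of the paper:

```python
import numpy as np

n = 3
rng = np.random.default_rng(0)
M = rng.standard_normal((n, n))       # illustrative operator A(X) = M X
X_true = rng.standard_normal((n, n))
B = M @ X_true                         # right-hand side with a known solution

# Column-major vectorization: vec(M X) = (I_n ⊗ M) vec(X).
K = np.kron(np.eye(n), M)
x_vec, *_ = np.linalg.lstsq(K, B.flatten(order="F"), rcond=None)
X = x_vec.reshape((n, n), order="F")   # minimum-residual solution as a matrix
```

The matrix LSQR of the paper avoids forming the Kronecker matrix K explicitly, applying $\mathcal{A}$ and its adjoint directly to matrices, which is what makes it practical for large n.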